fs-capacitor
Filesystem-buffered, passthrough stream that buffers indefinitely rather than propagate backpressure from downstream consumers.
The fs-capacitor npm package is designed to handle file streams efficiently, particularly in the context of handling file uploads in Node.js applications. It provides a way to manage temporary files and streams, ensuring that resources are properly cleaned up after use.
Creating a Write Stream
This feature allows you to create a write stream where you can write data to a temporary file. The stream can be used to handle file uploads or other data streams efficiently.
const { WriteStream } = require('fs-capacitor');
const writeStream = new WriteStream();
writeStream.write('Hello, World!');
writeStream.end();
Reading from a Write Stream
This feature allows you to create a read stream from a write stream, enabling you to read the data that was written to the temporary file. This is useful for processing uploaded files or other streamed data.
const { WriteStream } = require('fs-capacitor');
const writeStream = new WriteStream();
writeStream.write('Hello, World!');
writeStream.end();
writeStream.createReadStream().pipe(process.stdout);
Handling Errors
This feature allows you to handle errors that occur during the streaming process. By listening to the 'error' event, you can catch and handle any issues that arise while writing to or reading from the stream.
const { WriteStream } = require('fs-capacitor');
const writeStream = new WriteStream();
writeStream.on('error', (err) => {
console.error('Stream error:', err);
});
writeStream.write('Hello, World!');
writeStream.end();
Multer is a middleware for handling multipart/form-data, which is primarily used for uploading files. It is similar to fs-capacitor in that it handles file uploads, but it is more focused on integrating with Express.js applications and provides more features for handling different types of file uploads.
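For comparison, here is a minimal sketch of an Express route using Multer (assuming Express and Multer are installed; the "uploads/" destination and the "file" field name are arbitrary examples, not anything prescribed by fs-capacitor):
const express = require('express');
const multer = require('multer');
const app = express();
const upload = multer({ dest: 'uploads/' }); // Multer buffers each uploaded file to disk
app.post('/upload', upload.single('file'), (req, res) => {
  // req.file describes the stored file; req.body holds any accompanying text fields
  res.json({ name: req.file.originalname, size: req.file.size });
});
app.listen(3000);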
Busboy is a streaming parser for HTML form data for Node.js. It is similar to fs-capacitor in that it handles file streams, but it is more low-level and provides more control over the parsing process. It is often used in conjunction with other libraries to handle file uploads.
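As a rough illustration of that lower-level control, the following sketch (assuming the busboy v1 API, where the module exports a factory function; the temporary-file destination is just an example) streams each uploaded file directly to disk:
const http = require('http');
const fs = require('fs');
const os = require('os');
const path = require('path');
const busboy = require('busboy');
http.createServer((req, res) => {
  const bb = busboy({ headers: req.headers }); // parse the multipart body
  bb.on('file', (name, file, info) => {
    // each file arrives as a readable stream; you decide where it goes
    file.pipe(fs.createWriteStream(path.join(os.tmpdir(), path.basename(info.filename))));
  });
  bb.on('close', () => res.end('upload complete'));
  req.pipe(bb);
}).listen(3000);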
Formidable is a Node.js module for parsing form data, especially file uploads. It is similar to fs-capacitor in that it handles file uploads, but it provides more features for parsing and handling different types of form data.
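A minimal sketch using Formidable's callback API (assuming the formidable v2 factory function; earlier versions use new formidable.IncomingForm() instead):
const http = require('http');
const formidable = require('formidable');
http.createServer((req, res) => {
  const form = formidable({}); // default options
  form.parse(req, (err, fields, files) => {
    if (err) {
      res.writeHead(500);
      return res.end(String(err));
    }
    // fields holds text inputs, files describes the uploaded files
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ fields, files }));
  });
}).listen(3000);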
FS Capacitor is a filesystem buffer for finite node streams. It supports simultaneous read/write, and can be used to create multiple independent readable streams, each starting at the beginning of the buffer.
This is useful for file uploads and other situations where you want to avoid delays to the source stream, but have slow downstream transformations to apply:
import fs from "fs";
import http from "http";
import WriteStream from "fs-capacitor";
http.createServer((req, res) => {
  const capacitor = new WriteStream();
  const destination = fs.createWriteStream("destination.txt");
  // pipe data to the capacitor
  req.pipe(capacitor);
  // read data from the capacitor
  capacitor
    .createReadStream()
    .pipe(/* some slow Transform streams here */)
    .pipe(destination);
  // read data from the very beginning
  setTimeout(() => {
    capacitor.createReadStream().pipe(/* elsewhere */);
    // you can destroy a capacitor as soon as no more read streams are needed
    // without worrying if existing streams are fully consumed
    capacitor.destroy();
  }, 100);
});
It is especially important for use cases like graphql-upload, where server code may need to stash earlier parts of a stream until later parts have been processed, and attach multiple consumers at different times.
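A hedged sketch of that stash-now, consume-later pattern (not graphql-upload's actual implementation; stashStream and the file names are made up for illustration, and recent fs-capacitor versions use a named export while v2 used a default export):
const fs = require('fs');
const { WriteStream } = require('fs-capacitor');

function stashStream(source) {
  // begin buffering the source to a temporary file immediately,
  // without waiting for any consumer to be ready
  const capacitor = new WriteStream();
  source.pipe(capacitor);
  return capacitor;
}

// later, possibly long after the source has finished, attach independent consumers:
// const capacitor = stashStream(req);
// capacitor.createReadStream().pipe(fs.createWriteStream('validated-copy.bin'));
// capacitor.createReadStream().pipe(fs.createWriteStream('thumbnail-input.bin'));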
FS Capacitor creates its temporary files in the directory identified by os.tmpdir() and attempts to remove them once writeStream.destroy() has been called and all read streams are fully consumed or destroyed.
Please do note that FS Capacitor does NOT release disk space as data is consumed, and therefore is not suitable for use with infinite streams or those larger than the filesystem.
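A small sketch of that cleanup behavior, based on the rules above (the string payload is just an example):
const { WriteStream } = require('fs-capacitor');

const capacitor = new WriteStream();
capacitor.end('some buffered data'); // finish writing

const reader = capacitor.createReadStream();
capacitor.destroy(); // no error: cleanup waits until no read streams remain attached

reader.on('data', (chunk) => console.log('read %d bytes', chunk.length));
reader.on('end', () => {
  // the last reader has been consumed, so fs-capacitor can now
  // remove its temporary file from os.tmpdir()
});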
WriteStream
WriteStream inherits all the methods of fs.WriteStream.

new WriteStream()
Create a new WriteStream instance.

.createReadStream(): ReadStream
Create a new ReadStream instance attached to the WriteStream instance.
Once a WriteStream is fully destroyed, calling .createReadStream() will throw a ReadAfterDestroyedError error.
As soon as a ReadStream ends or is closed (such as by calling readStream.destroy()), it is detached from its WriteStream.

.destroy(error?: ?Error): void
If error is present, ReadStreams still attached are destroyed with the same error.
If error is null or undefined, destruction of underlying resources is delayed until no ReadStreams are attached to the WriteStream instance.

ReadStream
ReadStream inherits all the methods of fs.ReadStream.
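An illustrative sketch of the destroy-with-error behavior documented above (the error message is arbitrary):
const { WriteStream } = require('fs-capacitor');

const capacitor = new WriteStream();
const reader = capacitor.createReadStream();

reader.on('error', (err) => {
  // an error passed to capacitor.destroy(err) is forwarded to readers still attached
  console.error('reader destroyed with:', err.message);
});

capacitor.destroy(new Error('upload aborted'));

// once the capacitor is fully destroyed, any further capacitor.createReadStream()
// call throws a ReadAfterDestroyedError, as described above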
2.0.4
- Fixed the --experimental-modules mode published in v2.0.2 that broke compatibility with earlier Node.js versions, and tested both ESM and CJS builds (skipping --experimental-modules tests for Node.js v12), via #11.
- Used a browserslist field instead of configuring @babel/preset-env directly.
- Configured @babel/preset-env to use shipped proposals and loose mode.
- Gave config files .json extensions so they can be Prettier linted.
- Stopped excluding the lib directory from Prettier; it's meant to be pretty.
- Excluded package.json and package-lock.json from Prettier so npm can own the formatting.
- Configured eslint-plugin-node to resolve .mjs before .js and other extensions, for compatibility with the pre Node.js v12 --experimental-modules behavior.
- Removed the explicit ignore entry for node_modules, as it's already ignored by default.
- Used the classic TAP reporter for tests as it has more compact output.

FAQs
The npm package fs-capacitor receives a total of 0 weekly downloads; as such, its popularity is classified as not popular.
We found that fs-capacitor demonstrates an unhealthy version release cadence and project activity, as the last version was released a year ago. It has 1 open source maintainer collaborating on the project.